Display lag is a phenomenon associated with some types of liquid crystal displays (LCDs), and nearly all types of high-definition televisions (HDTVs). It refers to latency: the difference between the time a signal arrives at the input and the time the corresponding image appears on the screen. This lag has been measured as high as the equivalent of 3-4 frames on a 60 Hz display. Display lag is not to be confused with pixel response time. Currently, the majority of manufacturers do not include any specification or information about display latency on the screens they produce.

== Analog vs digital technology ==

For older analog cathode ray tube (CRT) technology, display lag is extremely low because the technology has no means of storing image data before display. The picture signal is minimally processed internally: it is demodulated from a radio-frequency (RF) carrier wave (for televisions), then split into separate signals for the red, green, and blue electron guns and for the vertical and horizontal sync timing. Image adjustments typically involve reshaping the signal waveform without storing it, so the image is written to the screen as fast as it is received, with only nanoseconds of delay for the signal to traverse the wiring inside the device from input to screen.

For modern digital signals, significant processing power and memory are needed to prepare an input signal for display. For either over-the-air or cable TV, the same analog demodulation techniques are used, but after that the signal is converted to digital data, which must be decompressed using the MPEG codec and rendered into an image bitmap stored in a frame buffer. For progressive-scan display modes, signal processing ends here, and the frame buffer is immediately written to the display device. In its simplest form, this processing may take several microseconds to occur.
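The relationship between lag expressed in frames and lag expressed in milliseconds follows directly from the refresh rate: one frame lasts 1000 / refresh-rate milliseconds. A minimal sketch of this conversion (the function name is illustrative, not from any standard library):

```python
def frames_to_ms(frames, refresh_hz):
    """Convert a lag measured in frames to milliseconds.

    At a given refresh rate, one frame lasts 1000 / refresh_hz ms,
    so a lag of N frames corresponds to N * 1000 / refresh_hz ms.
    """
    return frames * 1000.0 / refresh_hz

# The article's "3-4 frames on a 60 Hz display":
three_frames = frames_to_ms(3, 60)  # 50.0 ms
four_frames = frames_to_ms(4, 60)   # about 66.7 ms
```

So a lag of 3-4 frames at 60 Hz corresponds to roughly 50-67 ms, orders of magnitude above the nanosecond-scale delay attributed to CRTs above.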
For interlaced video, additional processing is frequently applied to deinterlace the image and make it seem clearer or more detailed than it actually is. This is done by storing several interlaced frames and then applying algorithms to determine areas of motion and stillness, either merging interlaced frames for smoothing or extrapolating where pixels are in motion; the resulting calculated frame buffer is then written to the display device. De-interlacing therefore imposes a delay that can be no shorter than the number of frames being stored for reference, plus an additional variable period for calculating the resulting extrapolated frame buffer.
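The motion-adaptive approach described above can be sketched in simplified form. The code below reconstructs one full frame from the current field and the opposite-parity lines of the previous field: where a pixel appears static it "weaves" the old line back in, and where motion is detected it interpolates vertically instead. The function name, the single-threshold motion test, and the use of plain lists of grayscale values are illustrative simplifications, not a production de-interlacer.

```python
def deinterlace(curr_even, prev_odd, threshold=16):
    """Rebuild a full frame from the current even field and the
    previous odd field (each a list of rows of 0-255 gray values).

    For each missing odd line, the old pixel is compared against the
    vertical average of its neighbours in the current field: a small
    difference suggests a static area (weave the old pixel back in);
    a large difference suggests motion (interpolate instead).
    """
    height = len(curr_even) + len(prev_odd)
    width = len(curr_even[0])
    frame = [None] * height
    for i, row in enumerate(curr_even):
        frame[2 * i] = list(row)  # even lines come straight from the new field
    for i, old_row in enumerate(prev_odd):
        above = curr_even[i]
        below = curr_even[i + 1] if i + 1 < len(curr_even) else above
        new_row = []
        for x in range(width):
            interp = (above[x] + below[x]) // 2  # vertical interpolation ("bob")
            if abs(old_row[x] - interp) < threshold:
                new_row.append(old_row[x])  # static area: weave old pixel
            else:
                new_row.append(interp)      # motion detected: use interpolation
        frame[2 * i + 1] = new_row
    return frame
```

Because the function needs the previous field before it can emit a frame, even this minimal version illustrates the delay the text describes: output always trails the input by at least one stored field, before any time spent on the per-pixel computation.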